Modified Dropout for Training Neural Network
Abstract
Dropout is a method for preventing overfitting when training deep neural networks: it samples different sub-networks by temporarily removing nodes at random. Although dropout works well in practice, its properties have not been fully explored or theoretically justified. Our project explores these properties by applying techniques from optimization, such as simulated annealing, and by using low-discrepancy sampling for model combination. We include experimental results that demonstrate the effects of these modifications and compare them to existing versions of dropout.
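As background, the following is a minimal sketch of the standard (inverted) dropout operation that these modifications build on; the function name and the rescale-at-training-time convention are illustrative choices, not code from the paper.

```python
import numpy as np

def dropout_forward(x, p_drop=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p_drop and
    rescale the survivors so expected activations match test time."""
    if not training or p_drop == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p_drop   # keep with probability 1 - p_drop
    return x * mask / (1.0 - p_drop)       # rescale the kept activations

# Each call samples a different random sub-network of the layer.
h = np.ones((2, 4))
print(dropout_forward(h, p_drop=0.5))
```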
Similar Resources
Compacting Neural Network Classifiers via Dropout Training
We introduce dropout compaction, a novel method for training feed-forward neural networks which realizes the performance gains of training a large model with dropout regularization, yet extracts a compact neural network for run-time efficiency. In the proposed method, we introduce a sparsity-inducing prior on the per-unit dropout retention probability so that the optimizer can effectively prune...
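The exact sparsity-inducing prior is not spelled out in this excerpt; below is a hypothetical sketch of the general idea, with an L1-style penalty and a fixed pruning threshold standing in for the paper's actual formulation.

```python
import numpy as np

def sparsity_penalty(retain_probs, lam=1e-3):
    # Illustrative L1-style prior pushing per-unit retention
    # probabilities toward zero during training.
    return lam * np.sum(np.abs(retain_probs))

def prune_units(weights, retain_probs, threshold=0.05):
    # Drop units whose learned retention probability fell near zero,
    # leaving a compact layer for run-time efficiency.
    keep = retain_probs > threshold
    return weights[:, keep], retain_probs[keep]

W = np.random.randn(8, 6)                       # 6 hidden units
r = np.array([0.9, 0.02, 0.7, 0.01, 0.8, 0.03])  # learned retention probs
W_small, r_small = prune_units(W, r)
print(W_small.shape)   # (8, 3): three units survive pruning
```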
Handwritten Character Recognition using Modified Gradient Descent Technique of Neural Networks and Representation of Conjugate Descent for Training Patterns
The purpose of this study is to analyze the performance of the backpropagation algorithm with changing training patterns and a second momentum term in feed-forward neural networks. This analysis is conducted on 250 different words of three small letters from the English alphabet. These words are presented to two vertical segmentation programs, which are designed in MATLAB and based on portions (1...
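As a rough illustration of a gradient-descent update augmented with a second momentum term, consider the sketch below; the coefficients and the exact form of the update in the cited study may differ.

```python
import numpy as np

def update(w, grad, prev_dw, prev_dw2, lr=0.05, alpha=0.5, beta=0.2):
    # Weight change uses the gradient plus the two previous changes:
    # alpha weights the usual momentum term, beta a second momentum term.
    dw = -lr * grad + alpha * prev_dw + beta * prev_dw2
    return w + dw, dw, prev_dw   # new weights, shifted change history

w = np.zeros(3)
dw1 = dw2 = np.zeros(3)
for step in range(100):
    grad = 2.0 * (w - 1.0)       # gradient of the toy loss ||w - 1||^2
    w, dw1, dw2 = update(w, grad, dw1, dw2)
print(w)                          # approaches the minimizer at 1.0
```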
Towards dropout training for convolutional neural networks
Recently, dropout has seen increasing use in deep learning. For deep convolutional neural networks, dropout is known to work well in fully-connected layers. However, its effect in convolutional and pooling layers is still not clear. This paper demonstrates that max-pooling dropout is equivalent to randomly picking an activation based on a multinomial distribution at training time. In light of this...
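The following sketch illustrates the stated equivalence: applying dropout to a pooling region before taking the max means any activation, not just the largest, can be selected, with selection probabilities following a multinomial distribution. Function and variable names are illustrative.

```python
import numpy as np

def max_pooling_dropout(region, p_drop=0.5, rng=None):
    """Drop each (non-negative, post-ReLU) activation in the pooling
    region with probability p_drop, then take the max of the survivors."""
    rng = rng or np.random.default_rng()
    mask = rng.random(region.shape) >= p_drop
    return (region * mask).max() if mask.any() else 0.0

region = np.array([0.2, 1.3, 0.7, 0.9])   # one 2x2 pooling window, flattened
samples = [max_pooling_dropout(region) for _ in range(10000)]
# Smaller activations now win occasionally, not only the maximum:
# the largest unit is selected only when it survives dropout.
print(np.mean(np.isclose(samples, 1.3)))  # approx. 1 - p_drop = 0.5
```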
Improving Neural Networks with Dropout by Nitish Srivastava
Improving Neural Networks with Dropout. Nitish Srivastava, Master of Science, Graduate Department of Computer Science, University of Toronto, 2013. Deep neural nets with a huge number of parameters are very powerful machine learning systems. However, overfitting is a serious problem in such networks. Large networks are also slow to use, making it difficult to deal with overfitting by combining many d...
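One standard way dropout sidesteps explicit model combination, described in this line of work, is to approximate the average over exponentially many sub-networks at test time by scaling weights or activations by the retention probability. A minimal sketch with illustrative names:

```python
import numpy as np

def train_layer(x, W, p_keep, rng):
    # Training: each forward pass uses a random sub-network.
    mask = rng.random(x.shape) < p_keep
    return (x * mask) @ W

def test_layer(x, W, p_keep):
    # Test: scale by p_keep, the expected value of the mask,
    # approximating the average of all sub-networks at once.
    return (x * p_keep) @ W

rng = np.random.default_rng(0)
x, W, p = np.ones(4), np.ones((4, 2)), 0.8
avg = np.mean([train_layer(x, W, p, rng) for _ in range(20000)], axis=0)
print(avg, test_layer(x, W, p))   # the two agree in expectation
```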
Adversarial Dropout for Supervised and Semi-supervised Learning
Recently, training with adversarial examples, which are generated by adding a small but worst-case perturbation to input examples, has improved the generalization performance of neural networks. In contrast to biasing individual inputs to enhance generality, this paper introduces adversarial dropout: a minimal set of dropouts that maximizes the divergence between 1) the training ...
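A hedged, brute-force sketch of the idea: among dropout masks differing from a random base mask in at most one unit, keep the one whose output diverges most (in KL divergence) from the clean prediction. The paper's actual optimization procedure differs; this sketch is for intuition only, and all names are illustrative.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def kl(p, q, eps=1e-12):
    # KL divergence between two discrete distributions.
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)))

def adversarial_mask(h, W, base_mask):
    """Search masks that differ from base_mask in at most one unit and
    return the one whose output diverges most from the clean output."""
    clean = softmax(h @ W)
    best_mask = base_mask
    best_div = kl(clean, softmax((h * base_mask) @ W))
    for i in range(len(h)):          # try flipping each unit in turn
        m = base_mask.copy()
        m[i] = 1.0 - m[i]
        d = kl(clean, softmax((h * m) @ W))
        if d > best_div:
            best_mask, best_div = m, d
    return best_mask

rng = np.random.default_rng(0)
h = rng.standard_normal(5)
W = rng.standard_normal((5, 3))
base = (rng.random(5) < 0.8).astype(float)
print(adversarial_mask(h, W, base))
```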